Features

The Not-So-New Rule

Understanding 21CFR11: How to make sense of the rules on electronic data

By: Carol Brandt

GMP Compliance Consulting, NNE Pharmaplan

In 1997, when 21CFR11 went into effect, its scope was sorely misunderstood. The regulation’s premise was to maintain proper identifiers for electronic data relevant to product safety, purity and efficacy. Just as printed documents provided a paper trail through the product life cycle, computer records needed to create a similar electronic trail, identifying users, approvals and progressions that lead back to the original data.

Conceptually there was nothing new about 21CFR11. In essence, it was a response to advances in the tools and equipment being used in the development, production and distribution of a pharmaceutical product. But the tracking of electronic records and signatures was an unknown entity that a majority of firms found intimidating and overwhelming.

Many in the industry chose to avoid 21CFR11 by maintaining their paper trail. The familiarity of that more labor-intensive and error-prone practice, combined with an uncertainty of how to establish and sustain Part 11 compliance, stalled opportunities offered by emerging technology. But fierce competition within the industry has forced a re-evaluation of that approach, bringing Part 11 compliance to the forefront. Of course, concerns of FDA inspection and enforcement have also played a part in this trend.

Avoidance was never a true option as long as technology existed within an organization. In fact, there is no grandfather clause in Part 11, so legacy systems fall under its compliance requirements. These older systems actually pose a more difficult problem than new technology. Legacy systems were often designed to overwrite data, virtually eliminating the electronic trail, and their encryption protocols and user signature capabilities rarely meet suggested guidelines. When these systems are integrated with others, creating a hybrid system housing both cGMP and non-cGMP functions, compliance issues expand to include any hardware or software interacting with cGMP data.

In the past two years, software vendors have been responding to the issues posed by 21CFR11. Audit trails and standardized encryption protocols are built into the configuration capabilities of many systems. We should clarify, however, that though these programs are capable of meeting guidelines for compliance, they must be properly configured and tested to ensure those components are activated. Once configured and tested, any changes to a system impacting production or decisions related to production must be controlled and re-validated to ensure continued compliance.

A list of vendors providing software claiming Part 11 compliance capability can be found at www.fda.gov by navigating to Dockets and following the link to docket 00D-1541. In addition to the list, a number of memos are posted from FDA meetings with manufacturers of specific system packages. It should be noted that the publication of the list and/or memos does not imply any FDA endorsement or sanction. Even if these software systems have the capability for Part 11 compliance, they still must be configured and validated before they comply with FDA regulations.


Understanding the Requirements
The FDA is inclined to look favorably on organizations that have a clearly defined approach to Part 11 compliance with specific activities structured around specific time frames. A Master Plan that analyzes current systems for gaps in compliance requirements and makes recommendations for corrective action is a good first step. But a lack of understanding about exactly what those requirements are has been part of the industry’s reaction to the New Rule.

Significant alarm has been raised over what constitutes an electronic record. Given the volume of electronic materials produced in any given business day, many pharmaceutical firms concluded it would be beyond their ability to comply. This is a misconception. Once a system has been properly configured with electronic identifiers, activated audit trails and security protocols, the volume of records or documents is not a concern. The system will have been qualified to manage that volume, and its ability to do so in compliance with Part 11 will have been validated.

The FDA makes no distinction between an electronic document and an electronic record. The easiest way to clarify whether or not an electronic document/record falls under Part 11 is to evaluate its relevance in terms of a hard or printed copy. If the printed record would fall under regulation, the electronic record will most likely need to be Part 11 compliant. Furthering that example, any identifications on the printed record, such as author name, time and date of development or approval must also be replicated and tracked in its electronic counterpart. Merely maintaining paper copies of those electronic records is not sufficient to achieve Part 11 compliance.

Electronic signatures are simply unique identifiers assigned to each individual using or maintaining a system. These are normally user names and passwords, though biometric technology would also apply. These electronic identifiers, or metadata as they are called, must be tracked within the system, leaving a trail that can be audited should a pharmaceutical product’s data or process be called into question. Again, this is similar to the paper process, replacing a filing cabinet full of signed documents with an automated electronic trail. The audit trail must be generated by the system itself, electronically, to be considered compliant.

The requirements for audit trails do not apply to the actions of devices, but to the actions of human operators. Again, there has been much confusion as to when audit trails are required and at what point the trail must start. There are different requirements for different types of records; that is why software marketed as Part 11 compatible cannot be considered compliant until the audit trails and security protocols have been configured to respond to the records it is producing.

Some records house information that is not subject to iteration or draft, such as laboratory test results. The data from each lab test is recorded as taken and clearly is not intended to be modified or changed. Any further testing will be treated as a separate record. The audit trail for this type of record must start in conjunction with the entry of the data. Here again, Part 11 compliance adds value by requiring security protocols to protect the integrity of that data from unauthorized modifications that could sabotage test results or eliminate proof of change.

Other records or documents do require versioning or iterative modification. But at some point in the development process, the document or record is identified as a completed work. This does not mean the document may not be modified in the future. It does mean that the premise and/or data presented in the document are critical to the integrity of future iterations. It is at this point that the audit trail must be activated for that record. All modifications must now be controlled, tracked and logged to clearly lead an inspector back through the development of the data to its original premise and author.

Such records can be built by a single individual or be part of a multi-author project, where a number of users contribute data. Once an electronic signature is affixed to a record that is considered a completed work in its first published iteration, the audit trail is activated. Each user’s data must be protected against unauthorized modifications and all changes tracked and attributed to the proper individual. In the case of unsigned records, such as a secretarial pool transcribing handwritten data, the audit trail will begin when the accuracy of the transcription has been verified and the data published for use. At that point, the approving user will be identified and all future modifications tracked.

Audit trails and security protocols are quite simply quality assurance tools. They not only plot workflow and manage accountability, but also pinpoint performance issues specific to individual users and minimize the potential for liability. Electronic identifiers are designed to function so that changes to records, including updates, deletions and insertions, are clearly tracked by user name and modification date. Attempts at unauthorized modification of data are not only blocked by security protocols, but the electronic identifier is essentially a fingerprint of the individual who is attempting access.
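The mechanics described above can be illustrated with an append-only log: every change carries the user name, a timestamp and the action taken, and an unauthorized attempt is both blocked and recorded. The following is a minimal Python sketch of that concept, not a compliant implementation; the class, field and user names are hypothetical.

```python
from datetime import datetime, timezone

class AuditedRecord:
    """A record whose every change is captured in an append-only audit trail."""

    def __init__(self, authorized_users):
        self.authorized = set(authorized_users)
        self.data = {}
        self.trail = []  # append-only: entries are never updated or deleted

    def _log(self, user, action, field):
        self.trail.append({
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "field": field,
        })

    def set(self, user, field, value):
        if user not in self.authorized:
            # The blocked attempt is itself logged: the identifier acts as
            # a fingerprint of the individual attempting access.
            self._log(user, "BLOCKED", field)
            raise PermissionError(f"{user} is not authorized")
        action = "update" if field in self.data else "insert"
        self.data[field] = value
        self._log(user, action, field)

record = AuditedRecord(authorized_users={"cbrandt"})
record.set("cbrandt", "assay_result", "98.7%")
try:
    record.set("temp_employee", "assay_result", "99.9%")
except PermissionError:
    pass

for entry in record.trail:
    print(entry["user"], entry["action"], entry["field"])
```

A production system would also need tamper-evident storage and secure, system-generated timestamps; the point here is only that attribution and blocking fall out naturally once every action is forced through a logged gateway.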


Getting Started
For companies uncertain about how to begin establishing compliance, a comprehensive systems assessment can be the foundation for a Part 11 Master Plan. That assessment starts with an inventory of all computer systems, including integrated hybrid systems, web-enabled interfaces and internal or outsourced data warehousing or archiving systems. In case of inspection, this master list demonstrates an organization’s commitment to a comprehensive approach toward Part 11 compliance.

The assessment must first evaluate each system to determine the risk factors for product safety if a computer error should occur. These risk factors will organize and prioritize the scheduling of compliance activities. But the assessment must go to the next level to truly add value to compliance planning. Each system must be broken down and a gap analysis conducted to identify the actions required to correct non-compliance.

As with all validation documents, it is impossible to be too specific in building your plan. Assessment templates should be standardized to maintain a common look and feel, identifying date and author. The assessment tool should consistently capture criteria for cross-system comparison. For example, a column would identify a Part 11 requirement, such as Security Design Protocols. In that column, specific system functions impacted by that requirement are identified. Observations of the system are noted, along with the corrective actions needed to establish Part 11 compliance. It is imperative to be specific. At the conclusion of the assessment, these actions must be plotted to a schedule to complete the Master Plan. It is possible that people outside the assessment team will manage their execution.
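As a sketch of how a standardized assessment row might be captured for cross-system comparison, the fragment below writes one entry per requirement to CSV. The column names are hypothetical, drawn from the template described above, and the sample observations are illustrative only.

```python
import csv
import io

# Hypothetical columns mirroring the template described above: the Part 11
# requirement, the system functions it impacts, observations, and the
# corrective actions needed to establish compliance.
COLUMNS = ["requirement", "impacted_functions", "observations", "corrective_actions"]

rows = [{
    "requirement": "Security Design Protocols",
    "impacted_functions": "User login; record approval",
    "observations": "Shared accounts in use; no account lockout policy",
    "corrective_actions": "Assign unique user IDs; enable lockout after failed logins",
}]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

Keeping every system's assessment in one fixed schema is what makes the later steps possible: rows can be compared across systems, and corrective actions can be lifted straight into the Master Plan schedule.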

A clear starting point for the assessment process is systems that already require validation. Obviously these have been identified as having a direct effect on the safety, purity and efficacy of a product. It is safe to assume that these systems should be a priority in the Part 11 Master Plan. If these systems have been properly validated, defined requirements and functions are documented, and their operational and functional performance has already been qualified. This will expedite the Part 11 assessment.

In addition to systems directly impacting production, computer systems that can affect decisions relating to production should also take priority. For example, imagine a scenario regarding a vendor database that is not Part 11 compliant and is managed by the purchasing group. The database contains various information about suppliers, including quality ratings on vendor products. Someone accesses the system and inaccurately upgrades the ranking for a vendor providing a specific raw material. There is no way to identify who made the change or whether it was approved. Because security protocols have not been established, access is not restricted to qualified users. Any member of the group, including a temporary employee filling in for a vacationing secretary, could have made the change.

Unfortunately, those rankings factor into decisions that affect the safety, purity and efficacy of the company’s products. The amount of sample testing done on a vendor’s raw materials stands in direct correlation to the vendor’s quality ranking. The only way the inaccuracy will be caught is if someone happens to notice the change and question it. Because there is no way to identify when the change was made, if the FDA should question it, huge numbers of product lots would be considered suspect. It’s a big price to pay for a single keystroke from an unidentified employee.

The information captured in the assessment process will also help to establish a foundation for Part 11 Standard Operating Procedures (SOPs), not only for retrospective validations, but also for new system implementations. A system life cycle must include Part 11 considerations in its design, development, testing, implementation and maintenance.


Avoiding Inefficiencies
The biggest inefficiency in computer system operations is the ongoing debate over what must be validated and to what extent. Confusion about Part 11 has only added to the problem. The FDA has developed training on 21 CFR Part 11 for its inspection staff. That training is available through the Freedom of Information Act and may help to educate managers who can then put an end to unproductive infighting and endless rework.

Replication of effort (maintaining a paper system to duplicate electronic records) creates a whole series of issues, not the least of which is inefficiency. The FDA specifically states that duplicate records should not be kept unless the law requires a paper record. Firms that choose this tactic are just making more work for themselves. Now they must staff and maintain two record-keeping systems, manual and electronic, and be vigilant in matching changes and updates between the two.

But even if an organization was able to maintain that consistency, it doesn’t change the law. Electronic records require electronic controls (identifiers, audit trails and metadata) even if a paper copy is held. If those controls do not exist or have not been validated, the electronic records are not in compliance. So despite a significant investment of time and money in a replicated work process that is clearly vulnerable to error, these firms have accomplished nothing. They have neither skirted Part 11 requirements nor established compliance.

Managing resistance to validation procedures is another substantial waste of time. Computer system validation requires very thorough written documentation of system operations and performance. Though programmers and IT professionals may balk at what seems to be relentless attention to detail, standards for validation documents must be enforced. These documents can be called into question during an FDA inspection and, if a public safety concern should arise, they support the integrity of product data.

An oft-overlooked imperative for successful compliance is the autonomy of validation professionals. It is impossible for compliance to be properly sustained if the reporting hierarchy of an organization allows validation priorities to be compromised by conflicting goals. As with any other quality assurance process, it is important that validation staff not report directly to system implementers. It is also critical that their role in the project be clearly communicated and their authority consistently supported.

Third-party validation professionals can be an effective solution. Their objectivity protects validation activities from potential conflicts with internal production schedules or goals. It is important to note, however, that if a vendor is also engaged as the system implementer, there must be a clearly demonstrated, autonomous reporting structure within the vendor’s team. Validation professionals must work collaboratively, not submissively.


Sustaining Compliance With Change Control
An unanticipated impact from 21CFR11 is the change management issues associated with Change Control for validated systems. Technology has infiltrated and integrated operations that were not previously validation issues for those executing cGMP functions. Suddenly, systems that had run autonomously have become subject to change control. Sales people, using wireless technology to record orders or sample distribution, find their electronic activities subject to Part 11 compliance. Finance systems have been integrated with web-enabled order entry databases, so changes to the financial system parameters must be controlled to ensure they have no impact on the cGMP data. This upheaval in corporate culture has created minor turf wars that must be managed to ensure that consistent compliance is a priority throughout the organization.

Documented controls for managing changes to a validated system are typically created during the testing process. During testing, system development is frozen. Code is considered “shrink-wrapped” and no changes to the system are allowed until testing is complete. The system is moved to a testing area, designed to simulate a “live” environment, including hardware, software functions, data load, interfaces and number of users.

Because the functional requirements of the system have been documented and tested, a baseline for validated performance has been clearly defined. Human interaction with that system, if not controlled, can literally invalidate it by making simple changes that are outside tested parameters. For example, a system is set up to function with unique user names to maintain the proper audit trail. The Help Desk Manager unwittingly assigns the user name “Administrator” to his entire team and they use it interchangeably. The electronic identifiers have been compromised and the system is no longer valid.

Change control not only attempts to maintain the integrity of validation parameters, it tests the impact of even moderate changes on functional performance. The key parameters for that testing are the integrity of the data input, the security of the data while housed and the consistency of the data output. Clearly, system users will make changes to the original data. They will add records and create calculations. Ongoing approvals will be generated and reports compiled. All of these normal day-to-day activities are anticipated changes accounted for with audit trails and security protocols. Still the impact of these changes should be tested and documented to ensure the integrity of data input, housing and output.

Here’s an example of how a simple change to a database, authorized without change control testing, could literally halt an NDA submission. Databases store information in tables. Updating those tables is considered a normal activity and is assigned to a specific user. The database in question stored information on environmental conditions in laboratory facilities. After the system was live, the authorized user logged in and added a new cold room to the database. The cold room was identified as “Cold10” and that label was given to the record. All the environmental data was properly input and properly stored.

But when a report was generated, the database program had been designed to only recognize the first five characters of a record’s label. The data for a room labeled “Cold1” automatically overwrote data for “Cold10.” Obviously, as additional rooms were added, the problem would continue to multiply. Because there had been no testing to gauge the impact of changing a table by adding a room, this seemingly normal and harmless activity could literally derail a product schedule. With the output integrity of the environmental data compromised, the lab results could not be considered valid. This is also a perfect example of how maintaining a paper copy of electronic records does nothing to protect the integrity of electronic data.
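The failure mode is easy to reproduce: if the reporting layer keys records on only the first five characters of a label, “Cold1” and “Cold10” collapse into the same key, and the later write silently overwrites the earlier one. A minimal Python sketch of that assumed behavior (the function and data are illustrative, not the actual system):

```python
# Simulates a reporting layer that (incorrectly) keys records on only the
# first five characters of a room label, as in the scenario described above.
KEY_LENGTH = 5

def build_report(records):
    report = {}
    for label, reading in records:
        key = label[:KEY_LENGTH]   # "Cold1" and "Cold10" both truncate to "Cold1"
        report[key] = reading      # the later entry silently overwrites the earlier
    return report

records = [
    ("Cold10", {"temp_c": 3.8}),   # the newly added cold room
    ("Cold1", {"temp_c": 4.1}),    # a routine reading for an existing room
]
report = build_report(records)
print(len(report))      # 1 — two rooms have collapsed into a single key
print(report["Cold1"])  # {'temp_c': 4.1} — Cold10's data is gone without a trace
```

No error is raised anywhere, which is exactly why untested changes of this kind are dangerous: the data loss is invisible until someone questions the report.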

It is these types of simple situations that are most often overlooked. Computers have become so commonplace that it is more natural to presume human error over system error. That relaxed attitude, particularly when dealing with legacy systems or data saved to different media, such as tape or disk, can ultimately invalidate compliance. In the scenario above, Part 11 compliance could actually be looked at as damage control. The audit trail would have clearly identified when the new room was added, eliminating any guesswork about what product had been affected and by how much.

Anticipated changes to a system must be tested, and users identified and authorized to make those changes within specific parameters. Once that testing has been completed, any unanticipated changes must be controlled and validated as they occur. It is very important that a documented process for controlling those changes be in place and be enforced. As with other validation procedures, that change-approval process should originate from an autonomous body outside the operation responsible for initiating or executing the change. The QA group would be a typical process owner.

Change control is ongoing by nature and is necessary for any system requiring validated electronic records or signature verification. This includes systems being used by outsourced vendors, whether for clinical trials or hosted data warehousing. A pharmaceutical company is considered accountable (and therefore liable) for the data integrity and electronic signatures of any computer system involved in the development of their products.

It is critical then that any business partner or subcontractor pursues their objectives based on the SOPs of the hiring firm. It is recommended that third-party vendors provide a copy of their system validation documents to QA for review. In addition, a process must be identified and documented for change controls on those systems and updated copies of the resulting documentation forwarded in a timely fashion. As in any legal arena, ignorance is not an adequate defense. Should a public health concern arise, such as unanticipated side effects during trials, the data integrity of computer records at a remote trial site is ultimately the responsibility of the company that owns the product.

Another often-ignored aspect of change control includes upgrades, patches or add-ons provided by a software vendor. Often these upgrades can be downloaded from the Internet or are sent on CD to the IT department, which typically implements them without question. These add-ons or patches change the parameters of the validated system and therefore fall under change control procedures. Their impact must be tested and validated to ensure that the integrity of the system is not compromised. This is true for any system upgrade, including those identified by the vendor as being Part 11 compliant.


An End to the Myths
The most common misperception of 21CFR11 is that it will create major headaches and expense. While it is true that an investment of time and capital is required to establish compliance, the benefits to the industry are immense. First and foremost, compliance ensures that critical equipment and systems are functioning at peak performance. Second, it clearly identifies each user, establishes hierarchies for data approval or modification, manages accountability and reduces paperwork, storage and handling. Third, it protects system data from unauthorized users who may accidentally or maliciously seek to manipulate or destroy electronic records.

Operating efficiencies can be realized once misperceptions around compliance are eradicated. Applying technology to manage workflow improves productivity and minimizes delay. A paper process might require the printing of five copies that are routed to five people for their signatures. Depending on availability, it could take as much as a week for those documents to be returned. Internet technology eliminates printing, handling and distribution, so electronic approvals can be generated at any time from anywhere. A five-day process can literally be reduced to one hour.

It is extremely helpful to educate IS/IT leaders and managers on validation guidelines for 21CFR11. Most IT professionals are unclear as to the big picture and can unknowingly commit actions that invalidate a system. Given their role in system implementation and maintenance, it is critical that IS/IT staff be champions of the validation process.

So what is the most important factor in establishing and sustaining compliance with the New Rule? A clear, consistent message from corporate leaders on the value of that compliance, demonstrated by a structured and timely Master Plan to be executed in accordance with documented SOPs for complying with 21CFR11.

